Kernel Isomap

Authors

  • Heeyoul Choi
  • Seungjin Choi
Abstract

Isomap [4] is a manifold learning algorithm which extends classical multidimensional scaling (MDS) by using approximate geodesic distance instead of Euclidean distance. The approximate geodesic distance matrix can be interpreted as a kernel matrix, which implies that Isomap can be solved as a kernel eigenvalue problem. However, the geodesic distance kernel matrix is not guaranteed to be positive semidefinite. In this letter we employ a constant-adding method, which leads to the Mercer kernel-based Isomap algorithm. Numerical experiments with noisy "Swiss roll" data confirm the validity and high performance of our kernel Isomap algorithm.

Indexing terms: Embedding, Kernel PCA, Manifold learning, MDS, Nonlinear dimensionality reduction.

Appeared in Electronics Letters, vol. 40, no. 25, pp. 1612-1613, December 2004. Please address correspondence to Prof. Seungjin Choi, Department of Computer Science, POSTECH, San 31 Hyoja-dong, Nam-gu, Pohang 790-784, Korea, Tel: +82-54-279-2259, Fax: +82-54-279-2299, Email: [email protected]
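The constant-adding step described above can be sketched as follows. This is a minimal NumPy illustration, assuming the geodesic distance matrix D has already been computed (e.g. via shortest paths on a neighborhood graph); the function names are illustrative, and the block-matrix eigenvalue formula for the smallest shift c* follows the classical additive-constant construction for distance matrices, which may differ in detail from the letter's exact derivation.

```python
import numpy as np

def centered_gram(A):
    """K(A) = -1/2 * H A H, with H the centering matrix."""
    n = A.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    return -0.5 * H @ A @ H

def kernel_isomap(D, d=2):
    """Embed points into d dimensions from a geodesic distance matrix D,
    after shifting off-diagonal distances by the smallest constant c*
    that makes the centered kernel positive semidefinite (Mercer)."""
    n = D.shape[0]
    K2 = centered_gram(D ** 2)   # K(D^2)
    K1 = centered_gram(D)        # K(D)
    # c* is the largest real eigenvalue of this 2n x 2n block matrix
    # (additive-constant construction; assumed form, not verified
    # against the letter's notation)
    Z = np.block([[np.zeros((n, n)), 2.0 * K2],
                  [-np.eye(n),      -4.0 * K1]])
    c = max(np.max(np.linalg.eigvals(Z).real), 0.0)
    # add c* to all off-diagonal distances and re-centre
    D_shift = D + c * (1.0 - np.eye(n))
    K = centered_gram(D_shift ** 2)
    # top-d eigenvectors scaled by sqrt of eigenvalues give coordinates
    w, V = np.linalg.eigh(K)
    order = np.argsort(w)[::-1][:d]
    return V[:, order] * np.sqrt(np.maximum(w[order], 0.0))
```

In practice D would come from shortest-path (graph geodesic) distances on a k-nearest-neighbor graph of the data; for an already-Euclidean D the computed shift c* is clamped to zero and the procedure reduces to classical MDS.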


Related articles

Robust kernel Isomap

Isomap is one of the most widely used low-dimensional embedding methods, in which geodesic distances on a weighted graph are incorporated into classical scaling (metric multidimensional scaling). In this paper we turn our attention to two critical issues that were not considered in Isomap: (1) the generalization property (projection property); (2) topological stability. Then we present a robust ker...


Learning Manifolds in Forensic Data

Chemical data related to illicit cocaine seizures is analyzed using linear and nonlinear dimensionality reduction methods. The goal is to find relevant features that could guide the data analysis process in chemical drug profiling, a recent field in the crime mapping community. The data has been collected using gas chromatography analysis. Several methods are tested: PCA, kernel PCA, isomap, sp...


Discussion of "Spectral Dimensionality Reduction via Maximum Entropy"

Since the introduction of LLE (Roweis and Saul, 2000) and Isomap (Tenenbaum et al., 2000), a large number of non-linear dimensionality reduction techniques (manifold learners) have been proposed. Many of these non-linear techniques can be viewed as instantiations of Kernel PCA; they employ a cleverly designed kernel matrix that preserves local data structure in the “feature space” (Bengio et al...


Learning Eigenfunctions Links Spectral Embedding and Kernel PCA

In this letter, we show a direct relation between spectral embedding methods and kernel principal components analysis and how both are special cases of a more general learning problem: learning the principal eigenfunctions of an operator defined from a kernel and the unknown data-generating density. Whereas spectral embedding methods provided only coordinates for the training points, the analys...


Deep Autoencoders for Dimensionality Reduction of High-Content Screening Data

High-content screening uses large collections of unlabeled cell image data to reason about genetics or cell biology. Two important tasks are to identify those cells which bear interesting phenotypes, and to identify sub-populations enriched for these phenotypes. This exploratory data analysis usually involves dimensionality reduction followed by clustering, in the hope that clusters represent a...



Publication year: 2005